From Nieman Reports: The powers and perils of news personalization

A new era of personalized news products began, in earnest, as a reaction to horrific global news.

Today, a Google search for news runs through the same algorithmic filtration system as any other Google search: your individual search history, geographic location, and other demographic information all affect what Google shows you. Exactly how your search results differ from anyone else’s, however, is a mystery. Not even the computer scientists who developed the algorithm could precisely reverse engineer it, because the same result can be reached through numerous paths, and because the ranking factors that decide which results show up first are constantly changing, as are the algorithms themselves.
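For illustration only, here is a toy sketch in Python of how such personalized filtering might blend signals. Every field, signal, and weight here is invented; Google’s actual ranking is proprietary and vastly more complex:

```python
# Toy personalized ranking: blend a base relevance score with
# invented user signals. Purely illustrative; not any real system.
def personalized_rank(results, user_profile):
    """Re-rank results using the user's (hypothetical) history and location."""
    def score(result):
        base = result["relevance"]
        # Boost topics the user has shown interest in before.
        history_boost = 0.5 if result["topic"] in user_profile["history"] else 0.0
        # Boost stories local to the user's region.
        local_boost = 0.3 if result["region"] == user_profile["region"] else 0.0
        return base + history_boost + local_boost
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Global markets dip", "topic": "finance",
     "region": "global", "relevance": 0.8},
    {"title": "Local school ruling", "topic": "education",
     "region": "ohio", "relevance": 0.6},
]
user = {"history": {"education"}, "region": "ohio"}
ranked = personalized_rank(results, user)  # the local story now outranks the global one
```

Two users with different histories would see the same stories in a different order, which is exactly the opacity described above: the inputs are personal, so no two result lists need match.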

We now get our news in real time, on-demand, tailored to our interests, across multiple platforms, without knowing just how much is actually personalized. It was technology companies like Google and Facebook, not traditional newsrooms, that made it so. But news organizations are increasingly betting that offering personalized content can help them draw audiences to their sites—and keep them coming back.

Personalization extends beyond how and where news organizations meet their readers. Already, smartphone users can subscribe to push notifications for the specific coverage areas that interest them. On Facebook, users can decide — to some extent — which organizations’ stories they would like to appear in their news feeds. At the same time, devices and platforms that use machine-learning to get to know their users will increasingly play a role in shaping ultra-personalized news products. Meanwhile, voice-activated artificially intelligent devices, such as Google Home and Amazon Echo, are poised to redefine the relationship between news consumers and the news.

While news personalization can help people manage information overload by making individuals’ news diets unique, it also threatens to create filter bubbles and, in turn, bias. This “creates a bit of an echo chamber,” says Judith Donath, author of The Social Machine: Designs for Living Online and a researcher affiliated with Harvard’s Berkman Klein Center for Internet & Society. “You get news that is designed to be palatable to you. It feeds into people’s appetite of expecting the news to be entertaining … [and] the desire to have news that’s reinforcing your beliefs, as opposed to teaching you about what’s happening in the world and helping you predict the future better.”

As data-tracking becomes more sophisticated, voice recognition software advances, and tech companies leverage personalization for profit, personalization will only become more acute. This is potentially alarming given the growth of websites — news-oriented and otherwise — inhabiting the political extremes, which on Facebook are easy to mistake for valid sources of news. When users can customize their news, and customize to these political and social extremes, civic discourse can suffer. “What’s important is how people use the news to have a discussion,” says Donath. “You may have friends or colleagues, and you read the same things in common. You may decide different things about it. Then you debate with those people. If you’re not even seeing the same news story, it leaves you with a much narrower set of people with whom you share that common ground. You’re losing the common ground of news.”

Keep reading at Nieman Reports →

U.S. Air Force photo by Senior Airman Elisa Labbe [Public domain], via Wikimedia Commons.

10 principles for data journalism in its second decade


In 2007 Bill Kovach and Tom Rosenstiel published The Elements of Journalism. With the concept of ‘journalism’ increasingly challenged by the fact that anyone could now publish to mass audiences, their principles represented a welcome platform-neutral attempt to articulate exactly how journalism could be untangled from the vehicles that carried it and the audiences it commanded.

In this extract from a forthcoming book chapter* I attempt to use Kovach and Rosenstiel’s principles (outlined in part 1 here) as the foundation for a set that might serve data journalism as it enters its second and third decades.

Principle 1: Data journalists should strive to interrogate data as a power in its own right

When data journalist Jean-Marc Manach set out to find out how many people had died while migrating to Europe he discovered that no EU member state held any data on migrants’ deaths. As one public official put it, dead migrants “aren’t migrating anymore, so why care?”

Similarly, when the BBC sent Freedom of Information requests to mental health trusts about their use of face-down restraint, six replied saying they could not say how often any form of restraint was used — despite being statutorily obliged to “document and review every episode of physical restraint which should include a detailed account of the restraint” under the Mental Health Act 1983.

The collection of data, the definitions used, and the ways that data informs decision making, are all exercises of power in their own right. The availability, accuracy and employment of data should all be particular focuses for data journalism as we see the expansion of smart cities and wearable technology.

Principle 2: Editorial independence includes technological independence

I wrote in 2013 about the role of coding in ensuring editorial independence, quoting Lawrence Lessig’s point, made over a decade ago, that code is law:

“Ours is the age of cyberspace. It, too, has a regulator. This regulator, too, threatens liberty. But so obsessed are we with the idea that liberty means “freedom from government” that we don’t even see the regulation in this new space. We therefore don’t see the threat to liberty that this regulation presents.

“This regulator is code—the software and hardware that make cyberspace as it is. This code, or architecture, sets the terms on which life in cyberspace is experienced. It determines how easy it is to protect privacy, or how easy it is to censor speech. It determines whether access to information is general or whether information is zoned. It affects who sees what, or what is monitored. In a host of ways that one cannot begin to see unless one begins to understand the nature of this code, the code of cyberspace regulates.” (Lessig 2006)

The independence of the journalist is traditionally portrayed as possessing the power to resist pressure from our sources, our bosses and business models, and the government and law enforcement. But in a networked age it will also mean independence from the biases inherent in the tools that we use.

From the content management systems that we use, to the mobile devices that record our every move, independence in the 21st century will be increasingly facilitated by being able to ‘hack’ our tools or build our own.

Code affects what information you can access, your ability to verify it, your ability to protect sources — and your ability to empower them. Finally, code affects your ability to engage users.

Code is a key infrastructure that we work in as journalists: if we understand it, we can move across it much more effectively. If it is invisible to us, we cannot adapt it, we cannot scrutinise it. We are, in short, subject to it.

Principle 3: We should strive for objectivity not just in the sources and language that we use, but also the way that we design our tools


Mapping tools assume a default point of view. Image: Time Magazine via Newberry Library via Jake Ptacek

In data journalism right now we are at a crucial stage: the era during which we move from making stories and tools for other people, to making our own tools.

As John Culkin, in writing about Marshall McLuhan, said:

“We shape our tools, and thereafter they shape us”.

The values which we embed in those tools, the truths we take for granted, will have implications beyond our own generation.

The work of Lawrence Lessig and Nicholas Diakopoulos highlights the role that code plays in shaping the public lives that we can lead; we need to apply the same scrutiny to our own processes.

When we build tools on maps do we embed the prejudices that have been identified by critical cartographers?

Do we seek objectivity in the visual language we use as well as the words that we choose?

But it is not just the tools which will shape our practice: the reorganisation of newsrooms and the creation of data desks and the data journalist’s routine will also begin to draw the borders of what is considered normal in – and what is considered outside of – the role of the data journalist.

Uskali and Kuutti, for example, already identify at least three different models for organising data journalism work practices: data desks, flexible data projects, and the entrepreneur or sub-contractor model. To what extent these models circumscribe or provide opportunities for new ways of doing journalism bears some reflection.

If we are to augment our journalism, we must do so critically.

Principle 4: Impartiality means not relying only on stories where data exists and is easy to obtain

The increasing abundance of data brings with it a new danger: that we do not look beyond what is already accessible, or that we give up too easily if a story does not seem practical.

Just as the expansion of the PR industry in the 20th century led to accusations of ‘churnalism’ in the media, the expansion of data in the 21st century risks leading to ‘data churnalism’ instead of data journalism, including the use of automation and dashboards as a way of dealing with those accessible sources.

Principle 5: We should strive to give a voice to those who are voiceless in data by seeking to create or open up data which would do so


When The Guardian’s ‘The Counted’ project sought to report on people killed by police in the US, it was literally seeking to ‘give a voice to the voiceless’ — because those people were dead; they could not speak.

The Bureau of Investigative Journalism’s Naming the Dead project had a similar objective: tracking and investigating US covert drone strikes since 2011 and seeking to identify those killed.

Neither is an example of data journalism that uses coding: the skills are as basic as keeping a record of every report you can find. And yet this basic process has an important role at the heart of modern journalism: digitising that which did not exist in digital form before, moving from zeroes to ones. You can find more examples in this post about the approach in 2015:

“Too often data journalism is described in ways that focus on the technical act of working with existing data. But to be voiceless often means that no data exists about your experience.”

Principle 6: We retain editorial responsibility for context and breadth of coverage where we provide personalisation

If journalism must provide a forum for public criticism and compromise, what role does personalisation — which gives each person a different experience of the story — play in that?

Some would argue that it contributes to ‘filter bubbles’ whereby people are unaware of the experiences and opinions of people outside of their social circle. But it can also bring people into stories they would not otherwise read at all, because those stories would seem to have no relevance to their lives.

As data journalists, then, we have a responsibility to consider the impact of personalisation and interactivity both in making news relevant to readers, and providing insights into other dimensions of the same story which might not be so directly relevant.

This, of course, has always been journalism’s skill: after all, human interest stories are the ‘universal’ hook that often draws people in to the significant and important.

Principle 7. We should strive to keep the significant interesting and relevant by seeking to find and tell the human story that the data shines a spotlight on

For the same reason, we should ensure that our stories are not merely about numbers, but people. I always tell my MA Data Journalism students that a good story should do two things: tell us why we should care, and tell us why it matters.

Data helps us to establish why a story matters: it connects one person’s story to 100 others like it; without data, a bad experience is merely an anecdote. But without a human story, data becomes just another statistic.

Principle 8. The algorithms in our work – both human and computational – should be open to scrutiny, and iteration

The more that journalism becomes augmented by automation, or facilitated by scripts, the more we should consider opening those processes to public scrutiny.

If we are unable to explain how we arrived at a particular result, that undermines the credibility of the conclusion.

Diakopoulos and Koliska, who have explored algorithmic transparency in the news media, conclude that it is an area much in need of research, development and experimentation:

“There are aspects of transparency information that are irrelevant to an immediate individual user context, but which are nonetheless of importance in media accountability for a broad public such as fair and uncensored access to information, bias in attention patterns, and other aggregate metrics of, for instance, error rates. In other words, some factors may have bearing on an individual whereas others have import for a larger public. Different disclosure mechanisms, such as periodic reports by ombudspeople may be more appropriate for factors like benchmarks, error analysis, or the methodology of data collection and processing, since they may not be of interest to or even comprehensible for many users yet demanded by those who value an expert account and explanation.”

Principle 9. Sharing our code also allows us to work more efficiently and raise standards


It has often been said that transparency is the new objectivity in this new world of limitless publishing. This recognises that while true objectivity does not exist, transparency can help establish what steps we have taken to come as close to it as we can.

The AP Stylebook‘s section on data journalism has formally recognised this with its reference to reproducible analysis:

“Providing readers with a roadmap to replicate the analysis we’ve done is an essential element of transparency in our reporting. We can accomplish this transparency in many ways, depending on the data set and the story”

But what is transparency for data journalists? Jennifer Stark and Nicholas Diakopoulos outline principles from scientific research that can be adapted – specifically reproducibility and replicability.

Reproducibility involves making code and data available so a user can rerun the original analysis on the original data: “This is the most basic requirement for checking code and verifying results.”

Replicability, on the other hand, “requires achieving the same outcome with independent data collection, code and analysis. If the same outcome can be achieved with a different sample, experimenters and analysis software, then it is more likely to be true.”

Currently the code-sharing site GitHub is the place where many data teams share their code so that others can reproduce their analysis. It is incredible to look across the GitHub repositories of FiveThirtyEight or BuzzFeed and understand how the journalism was done. It also acts as a way to train and attract future talent into the industry, either formally as employees, or informally as contributors.
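A reproducible analysis in Stark and Diakopoulos’s sense can be as modest as a single script, published alongside the story, that rebuilds the findings from the raw data. A minimal sketch, with a dataset and field names invented purely for illustration:

```python
# Minimal reproducible analysis: anyone rerunning this script on the
# same raw data gets the same published numbers. Data is invented.
import csv
import io

RAW = """state,incidents
CA,120
TX,95
NY,80
"""

def analyse(raw_csv):
    """Return (total incidents, state with the most) from the raw CSV."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    total = sum(int(r["incidents"]) for r in rows)
    top = max(rows, key=lambda r: int(r["incidents"]))["state"]
    return total, top

if __name__ == "__main__":
    total, top = analyse(RAW)
    print(f"Total incidents: {total}; highest: {top}")
```

Replicability goes further: a second team would gather its own data and write its own analysis, checking whether it reaches the same conclusion.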

Principle 10. We should seek to empower citizens to exercise their rights and responsibilities


The New York Times You Draw It challenges users to take an active role

The final principle mirrors Kovach and Rosenstiel’s: the obligation on the public to take some responsibility for journalism too. And it is here, perhaps, where data journalism has the most significant role to play.

Because where Kovach and Rosenstiel put the onus on the public, I believe that data journalism is well positioned to do more, and to actively empower that public to exercise those rights and responsibilities.

A New York Times interactive which invites the user to draw a line chart before revealing how close they were to the true trend is precisely the sort of journalism which helps users engage with their own role in negotiating information.

A tool which allows you to write to your local representative, or to submit a Freedom of Information request, is one which positions the reader not as a passive consumer of news, but as an active participant in the world that they are reading about.

In print and on air we could only arm our audiences with information, and hope that they use it wisely. Online we can do much more — and we’ve only just begun.

*You can read all three extended extracts from the book chapter under the tag ‘Next Wave of Data Journalism’ here.


Computational thinking and the next wave of data journalism

In this second extract from a forthcoming book chapter I look at the role that computational thinking is likely to play in the next wave of data journalism — and the need to problematise that. You can read the first part of this series here.

Computational thinking is the process of logical problem solving that allows us to break down challenges into manageable chunks. It is ‘computational’ not only because it is logical in the same way that a computer is, but also because this allows us to turn to computing power to solve them.

As Jeannette M. Wing puts it:

“To reading, writing, and arithmetic, we should add computational thinking to every child’s analytical ability. Just as the printing press facilitated the spread of the three Rs, what is appropriately incestuous about this vision is that computing and computers facilitate the spread of computational thinking.”

This process is at the heart of a data journalist’s work: it is what allows the data journalist to solve the problems that make up so much of modern journalism, and to be able to do so with the speed and accuracy that news processes demand.

It is, in Wing’s words, “conceptualizing, not programming” and “a way that humans, not computers, think.”

“Computers are dull and boring; humans are clever and imaginative. We humans make computers exciting. Equipped with computing devices, we use our cleverness to tackle problems we would not dare take on before the age of computing and build systems with functionality limited only by our imaginations” (Wing 2006)

And it is this – not coding, or spreadsheets, or visualisation – that I believe distinguishes the next wave of journalists: skills of decomposition (breaking problems down into parts), pattern recognition, abstraction and algorithm building, the same skills that schoolchildren are being taught right now. Imagine what mass computational literacy will do to the news industry.
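To make those four skills concrete, here is a small newsroom-flavoured sketch in Python; the councils and figures are invented:

```python
# Question: which councils' spending rose unusually fast?
# Decomposition: reduce the question to pairs of figures we can compare.
spending = {                       # (last year, this year), in £m; invented
    "Northshire": (1.00, 1.05),
    "Westhaven": (2.00, 3.10),
    "Eastfield": (0.80, 0.82),
}

# Abstraction: one reusable rule for "how fast did it rise?"
def pct_change(before, after):
    return (after - before) / before * 100

# Algorithm: apply the rule, filter by a threshold, sort by size.
def outliers(data, threshold=20):
    changes = {name: pct_change(b, a) for name, (b, a) in data.items()}
    return sorted((n for n, p in changes.items() if p > threshold),
                  key=lambda n: changes[n], reverse=True)

# Pattern recognition: a 55% jump stands out against single-digit changes.
flagged = outliers(spending)  # ['Westhaven']
```

None of this requires more than a few lines of code; the thinking that precedes the code is the skill.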

Nicholas Diakopoulos’s work on the investigation of algorithms is just one example of computational thinking in practice. In his Tow Center report on algorithmic accountability he outlines an approach to reverse-engineer the ‘black boxes’ that shape how we experience an increasingly digitised world:

“Algorithms must always have an input and output; the black box actually has two little openings. We can take advantage of those inputs and outputs to reverse engineer what’s going on inside. If you vary the inputs in enough ways and pay close attention to the outputs, you can start piecing together a theory, or at least a story, of how the algorithm works, including how it transforms each input into an output, and what kinds of inputs it’s using. We don’t necessarily need to understand the code of the algorithm to start surmising something about how the algorithm works in practice.”
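A hedged sketch of that probing strategy in Python: `black_box` stands in for the system under scrutiny (in a real audit it would be an external service, not a local function whose rule we can read), and its pricing rule is invented:

```python
# Black-box auditing by varying inputs: hold everything else fixed,
# change one input at a time, and watch how the output moves.
def black_box(price, user_region):
    # Hidden rule we pretend not to know: a surcharge for one region.
    return price * (1.2 if user_region == "rural" else 1.0)

def probe(fn, base_inputs, vary_key, values):
    """Vary one input while holding the others fixed; collect the outputs."""
    results = {}
    for value in values:
        inputs = dict(base_inputs, **{vary_key: value})
        results[value] = fn(**inputs)
    return results

observed = probe(black_box, {"price": 100, "user_region": "urban"},
                 "user_region", ["urban", "suburban", "rural"])
# Comparing outputs across regions surfaces the differential treatment
# without ever reading the algorithm's code.
```

The same pattern scales up: journalists auditing real systems submit many crafted inputs and piece together a theory of the algorithm from the pattern of outputs.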

Problematising computationality


Infoamazonia is one of a number of projects seeking to make environmental crime ‘visible’. Image from Geojournalism

But the next wave of data journalism cannot just solve the new technical problems that the industry faces: it must also “problematise computationality”, to use the words of David M. Berry:

“So that we are able to think critically about how knowledge in the 21st century is transformed into information through computational techniques, particularly within software.”

His argument relates to the role of the modern university in a digital society, but the same arguments can be made about journalism’s role too:

“The digital assemblages that are now being built … provide destabilising amounts of knowledge and information that lack the regulating force of philosophy — which, Kant argued, ensures that institutions remain rational.

“… There no longer seems to be the professor who tells you what you should be looking up and the ‘three arguments in favour of it’ and the ‘three arguments against it’.”

This is not to argue for the reintroduction of gatekeepers, but to highlight instead that information is not neutral, and it is the role of the journalist – just as it is the role of the educator – to put that information into context.

Crime mapping is one particularly good example of this. What can be more straightforward than placing crimes on a map? As Theo Kindynis writes of crime mapping, however:

“It is increasingly the case that it simply does not make sense to think about certain types of crime in terms of our conventional notions of space. Cybercrime, white-collar financial crime, transnational terrorism, fraud and identity theft all have very real local (and global) consequences, yet ‘take place’ within, through or across the ‘space of flows’ (Castells 1996). Such a-spatial or inter-spatial crime is invariably omitted from conventional crime maps.” (Kindynis 2014)

All this serves to provide some shape to the landscape that we are approaching. To navigate it we perhaps need some more specific principles of our own to help.

In the third and final part of this series, then, I want to attempt to build on Kovach and Rosenstiel’s work with principles which might form a basis for data journalism as it enters its second and third decades.

You can read all three extracts under the tag ‘Next Wave of Data Journalism’ here.


From tech to ethics: Will AI be a threat to journalism?

The advent of highly accessible technology has ushered in a new modern era. With automation, robots and artificial intelligence as the media’s buzzwords of choice in recent times, what will the role of human journalism be in this digital modernity?

Christoph Thun-Hohenstein, Director of MAK, the Austrian Museum of Applied/Contemporary Art in Vienna, discusses the expectations and dangers surrounding this newfound modernity and introduces the ‘Hello, Robot’ exhibition that GEN Summit attendees will get to enjoy at the first social event on Wednesday 21 June, at the close of the first day of the conference.

Richard Vijgen, Architecture of Radio, 2015 © 2016 Richard Vijgen Studio/Photo: Juuke Schoorl

GEN: For the media specialists attending the GEN Summit and visiting the exhibition, how is MAK approaching the topics of digital innovation, automation and robotics?

Christoph Thun-Hohenstein: The big picture is that the world is currently divided into two movements. One is Dataism, a movement based on the belief that the ideal state of the world is one governed not by humans but by data, as Big Data. In this view, data is constantly parsed, updated, analysed and algorithmically optimised, and humans no longer play a pivotal role in it. The human component of this data, in the broader sense of the term, has become redundant.

The other movement is the so-called “Techno-humanism”, which is about the upgrading of humans: intellectually, mentally and physically.

In both directions, and behind those movements, lies quite a lot of power and money. These movements could be useful in some of their teachings, but they are not anything to aspire to as a whole. We should instead focus more on the kind of future we want as humans, as biological beings with consciousness, and on the evolution we want to achieve. Obviously journalism plays a big part in all of this, as a means to carry this debate, expose the information, and give the general public the perspective they need on the times we are living in and where the technology we are embracing could be leading us.

As self-learning digital machines are getting better at mimicking the human touch, our minds are getting accordingly adept at understanding and wholeheartedly accepting algorithms and digital systems taking over some parts of our lives.

We can imagine that more and more of our other human capabilities will be discarded, will simply not be used anymore, if our mindsets are geared to work as digital partners with machines. How can we initiate a pro-human approach? How can we set a human agenda that makes us aware of what is happening, enabling us to use the parts of digital innovation that are really helpful and to renounce the ones driving us in a direction we would not want to follow if we knew the repercussions they would have for us in the future? In a nutshell, this is what the ‘Hello, Robot’ exhibition is all about.

What does it all mean for the media? What can journalists take away from this?

Obviously a lot is changing in the media right now; it is hard enough to know what the industry will look like in five years’ time. In my mind, we have barely started a discussion, in the industry and with audiences, on where this new modernity will lead us. And that discussion matters, as more automation, devices and algorithms come into our lives, with the incremental innovation we get almost every day through upgrades. What options do we have? What influences are at play here? What about the money behind the digital monopolies being cemented as we speak, and where is that money leading us?

There is quite a lot of research being conducted on developing a ‘superintelligence’: an artificial intelligence able to outsmart humans because it would be far superior to human intellectual capabilities, wherein lie countless possibilities. This is a point that will become more prominent in the next 30 to 40 years. Such a superintelligence is not just possible; it is actually plausible.

So what the media needs to do is deal with the main aspects of digital modernity: the possible outcomes of current digital developments and the implications these technologies will have on people’s lives, not only in 18 months’ time but also in a few decades. The general public needs to understand the impact of robotics, artificial intelligence and also biotechnology. This, I think, is the first big imperative we need to tackle.

What role can the media and news play in discussing the ethics issue with technology and automation?

As digital programs get increasingly smart and prevalent on account of their self-learning capabilities, the media will make an impact by making the processes and the stakes obvious to the audience. The question then will be whether we are able to discuss and collectively decide on a future we want to live in and plan for. There will hopefully be scope for celebrating human qualities that cannot be calculated and catered for by digital programs, by algorithms. We should also want part of our lives to resonate.

Mia Meusburger and Johanna Pichlbauer, a visual display of summer’s status from Vienna Summer Scouts, 2014 © Mia Meusburger, Johanna Pichlbauer / ID2 Studio / University of Applied Arts, Vienna

We live in a very fast world and have more or less come to terms with this; we all organise our lives with, and by the terms of, our smartphones. But I think we also need to reserve moments in our daily lives for different content, different offerings.

We need to talk about ethics and technology, and also about shaping the future and what the options are. I think one task of the media is to dig, to expand on this topic and its potential repercussions. I have to say that compared to 10 years ago there has been tremendous progress in the last year, as more publications are covering ethics in digital innovation now. But I doubt that this reflection goes broad or deep enough. Journalists have to become much more fundamental about this, to grasp the issue in its entirety, so that the general public can understand what is at stake.

A big part of what I believe, and hope for, is that the media has a major role to play in dealing with the most important dimensions of our new digital modernity. Currently a lot of attention is turned towards brand new slices of innovation, disruptive or otherwise, but you rarely come across articles that put the whole phenomenon of disruption into perspective.

What do you think is the biggest pitfall here? If we don’t have this discussion about machines, are we facing what many scientists have pointed out, namely that AI might endanger the human species at some point? What are the dangers of this phenomenon?

Some scientists have been talking about this: a big wave of automation, bigger than what exists now. The system works in such a way that as soon as one company starts down this path of partial-to-full automation, when technology permits it, all the other companies in the same field have to do the same. This is the law of the market. When this happens, it will be big; there will be waves of innovation. Look at what has happened in the service industry: at banks and insurance companies, a large chunk of the workforce is being done away with as they become fully automated.

As long as we still have options, we need to be able to understand these issues and what they entail, so we can make the conscious choice to go all the way or into a completely different direction altogether.

Researchers working on artificial intelligence want to achieve progress and break new ground. They will not stop until they see the promise of a “superintelligence”, in the sense of an artificial intelligence much smarter than humans, realised. There is nothing that will stop them from doing it.

We do not currently have a broad enough discussion about these issues, but it is high time we did. It has started, especially in the past year as I said, but it has to be much more systematic; this conversation has to have a much stronger presence everywhere.

Robotlab (ZKM), Manifest, 2008 © robotlab/ZKM

What would it take for the media to be more proactive on this front?

Not to be hypocritical, since algorithms are becoming important in journalism and will develop more and more, but to be precise and determined. A line needs to be drawn, and it needs to be clear that there is still an important scope for “human journalism”, because even the most advanced self-learning machines cannot achieve what it can.

True storytelling is always going to be taken care of by human hands and a human brain, in one way or another: the voice of a piece is too important, so is the emotional approach, which might condition the reach and audience engagement.

Machine learning really means that information can be processed and replicated at the same level of quality. And while machines can mimic with similar apparent value, they cannot yet be creative and come up with something brand new, a new narrative. Here lies the inherent, and hopefully enduring, strength of journalism and storytelling. To protect this, we need to start a discussion now.

MAK, Museum of Applied/Contemporary Art, Vienna

About Christoph Thun-Hohenstein

Christoph Thun-Hohenstein assumed direction of the MAK — Austrian Museum of Applied Arts / Contemporary Art on 1 September 2011. He was director of the Austrian Cultural Forum New York from 1999 to 2007, after which he served as managing director of departure — the Creative Agency of the City of Vienna, until August 2011. Christoph Thun-Hohenstein has published on topics dealing above all with European integration and with contemporary culture and art, and has held numerous lectures on these topics. He has also curated exhibitions of contemporary art, and he regularly serves on selection juries.


From tech to ethics: Will AI be a threat to journalism? was originally published in Global Editors Network on Medium, where people are continuing the conversation by highlighting and responding to this story.

The future of journalism is not all doom and gloom. Here’s why

The Reuters Institute for the Study of Journalism will be unveiling its 2017 Digital News: Essential Data on the Future of News report at the GEN Summit in Vienna, on 22 June. We talked to Nic Newman, author of the report, to get an early glimpse of what we can expect from it and what is going to shape the media industry in the foreseeable future.

David Levy, Nic Newman and Matt Kelly at the GEN Summit 2016

GEN: What trends are emerging this year and will be more prominent in the upcoming months for news?

Nic Newman: It’s been an extraordinary year for the news industry, because of this perfect storm of fake news (and how to define it), business models, and the growing realisation that platforms are not just platforms. Those three things together condition how we create journalism and how we distribute journalism. They show that we are really at an inflection point as an industry.

One of the things we do in the 2017 Digital News: Essential Data on the Future of News report, which we will be revealing at the GEN Summit, is gather country reports from every country on the Reuters Institute supply side, giving us insights into what is going on in terms of journalistic jobs and in terms of business models. The responses show an enormous strain: in Australia, for instance, Fairfax is losing 25 percent of editorial jobs. There are a lot of job cuts in journalism in quite a few countries, but on the more positive side we are also seeing real innovation in business models. We are really starting to see a change there.

Last year I felt it was all quite depressing; it was a very depressing report to read. This year there will be a few more moments of optimism, particularly around the inventiveness and creativity people are showing in trying to take advantage of the situation. Some of the content and innovation we are seeing from some of our partners is extremely impressive.

So it is still a storm, and it is still somewhat depressing, but there are definitely more ‘home-run’ moments in this year’s report.

What is going to be key for the future of news publications?

What is happening with platforms in general is key. Again, I am not going to give away the details of the report, because we want to reveal those in June, but we are seeing a lot of change within the social networks space that is perhaps a bit hidden, and we will explore this. For the past five years we have seen the growth and increasing power of Facebook specifically. We are now reaching a point of saturation within developed markets for the “old style” social networks; they are getting disrupted.

It is really about the role of platforms and algorithms. In terms of how people discover news, over the last five years we’ve seen this shift from the majority of people going directly to a news site and getting stories selected by an editor to many more people coming across (and then selecting) news via an algorithm.

This chart shows that across all countries editorial selection (direct, email, notifications linked to an app) is now only just in the lead (52%), but for under-35s, who use more social media in particular, we are now in a world where the majority of content is selected by an algorithm (55%, compared with 43% for direct).

This is why the issue of who programmes the algorithms, the transparency of those algorithms and what kind of content they surface matter so much. We will be discussing this at the GEN Summit.

We will also expand a lot on advanced messaging apps in the report, as in some of the new countries we are looking at this year, messaging is already more powerful than traditional social networks. We used to look at the US to understand what was happening and what was rising; now that would not make sense. Asia and Latin America are better examples for these emerging trends.

About the evolution of business models, do you see this affecting legacy media and digital pure players alike?

Absolutely, it is affecting everybody. What is interesting is that many of the pure players, though only a few years old, are already being disrupted by changes in distribution models. They started out with one approach, which was very successful for a while, but they cannot assume this approach is going to continue to be successful, as every year something new changes the landscape, particularly in this disrupted world. If you look at what a few of the New York companies are doing on distributed models, you will see that this is the year when the greatest change is happening.

From the Journalism, Media and Technology Trends 2017, Reuters Institute

On the legacy side, what we are seeing is a major refocusing away from pure reach and pure numbers towards more subtle issues: “How do we create value? How do we get people to come back to our site more often?” One of the big elements in our report this year comes from research we have conducted with focus groups, especially in Europe, on how consumers think about different models for paying for news: not just paywalls but also some of the emerging models like micro-payments and bundles of different propositions, the aggregated ‘Spotify for news’ kind of ideas. We have been talking to consumers about how they would feel about some of those models.

What we found is that we have gone from a world where everyone in the media assumed publishing was going to be funded by advertising, to one where everyone senses that no publication can survive without a paywall. I believe that neither of those is true. When we talked to consumers, it became obvious they cannot afford to pay for four or five digital subscriptions. What they really want is to keep doing what they are already doing in the digital world: navigating from one site to another. They love that, and they do not want to go back to the offline world where they are forced, from a financial standpoint, to get news from only one provider.

So I don’t think that single-stack paywalls are going to be the answer either. Ultimately publishers are going to have to think much more radically about how they combine those models into something that fits with what the consumer wants.

What will be the solution for publishers?

Everyone is realising that one single business model is not going to be enough. Essentially what is needed is three, four or even five different approaches, meaning that publishers will be protected, to some degree, from a sudden shock: a sudden change of an algorithm by Facebook, for instance, or a consequent downturn in display advertising.

Having a more distributed model would help. Over the last year we have increasingly been seeing positive signs, with publications starting to develop income streams around events, sponsored content or data. These make for different business models, and every publisher will have a different approach.

To some extent these help, but above the business models, publications have to have real clarity about what they are for. The most successful media companies are very clear about what their fundamental strategy is. Beneath that, publishers need to iterate very quickly, particularly if the fundamental strategy is, for example, about making money out of branded content and building distribution networks. If that were your strategy, you would have to keep changing what you did, how your formats worked and which networks you were using.

You were mentioning a more optimistic outlook for the future of the news industry, what is affecting it?

I suppose this is what we will be discussing in June at the conference, which is the reason why everyone needs to be at the GEN Summit. As I described earlier, what we are seeing is partly a response to the desperate state of the economic situation, with publications now really counting on business model innovation to move forward.

A very positive element, and a source of hope, is actually the fake news phenomenon. Before, people everywhere felt that there wasn’t really a problem with news, from a consumer point of view: it was fine, it was all free, and a lot of people thought they should never have to pay for it, which was part of the problem. Because of fake news, the general public has come to the realisation that journalism doesn’t come for free. There is good journalism, there is bad journalism, and there is quality journalism, which cannot be found everywhere; it is actually quite scarce, and it might be something people need to pay for. The increasing pollution of our news environments, which is what I think is going on, is creating a situation where quality news brands, or brands that have something to say, have an opportunity to charge for their journalism, either directly or through a creative approach to advertising in the marketplace. That is the ray of hope I take from the whole ‘fake news’ debacle.

What is the next big disrupter, technology-wise, for news? How are immersive and other technologies faring in newsrooms?

For the first time this year, we have asked in the report about ‘voice’: voice-activated devices such as Amazon’s Echo and Echo Dot, powered by Alexa. People at the GEN Summit are going to be very surprised at some of the results we got around voice. Voice is going to be an incredibly important platform, and in the short term it is more important than wearables (glasses, watches and so on), which everyone got very excited about. Media companies need to take ‘voice’ seriously as an emerging platform that is developing quite quickly.

The Amazon Echo Dot

So far voice-activated devices are sold in only four countries, whereas the report looks at 36. In most countries they aren’t relevant yet, but in the countries where we surveyed voice for the first time, the extent of its usage, not only in general but for news specifically, is impressive.

The obvious implication here is the rise of audio, beyond radio programmes or even the popularity of podcasts. ‘Voice’ as a new platform for news is only just emerging; it has only just launched. But on a five-year horizon, it is going to be a major disrupter.

The VR for News report, authored by Zillah Watson of the BBC, looks very widely at best practices in VR. Virtual reality, and any type of technology for immersive journalism, is a much longer-term opportunity; it is going to take a long time to build. While we see major potential in 360 journalism, there is a lot of experimentation going on around it at the moment. But there are quite a few barriers standing between the technology and innovations in 360 and their implementation in newsrooms, so it is going to take some time.

It could be years before we see the potential of VR, which I think is going to be very strong, but it is going to be different for ‘voice’, as it will be easier to implement and really disruptive. Amazon is going to disrupt all kinds of business models, including Google’s, since Google mainly built its business on display advertising adjacent to search. AR also offers a lot of opportunities, with a huge bundle of technologies bringing disruption to almost every stage of the news value chain.

From the Reuters Institute VR for News report, by Zillah Watson, BBC

About Nic Newman

Nic Newman is a journalist and digital strategist who played a key role in shaping the BBC’s internet services over more than a decade. He was a founding member of the BBC News Website, leading international coverage as World Editor (1997–2001). As Head of Product Development for BBC News he helped introduce innovations such as blogs, podcasting and on-demand video. He has played an important part in the development of social media strategies and guidelines for the wider BBC. Nic is currently a Visiting Fellow at the Reuters Institute for the Study of Journalism and a consultant on digital media.

Rich Harris—The Guardian US

“There’s been this trend in recent years towards trying to make journalism as digestible as possible, as easy to share as you can make it. And we wanted to do something that explicitly went against that trend.” (Journalism.co.uk, 18 August 2016)

Ben Kreimer—Buzzfeed

“Let’s try things out. Even if virtual reality journalism is not exploding in terms of hits right now, it pays to be a part of it. VR news is going…well, somewhere.” (The Media Online, 7 March 2017)

Amy Webb—Future Today Institute

“This isn’t to say that every single journalist needs to become a coder overnight, but I do think it’s important that news organisations understand what AI can and cannot do. I see a fairly big disconnect right now, with some organisations thinking that AI will eliminate all the reporters and others thinking that it will somehow magically allow them to write millions of stories.” (Journalism.co.uk, 13 December 2016)

Quotes brought to you by Storyzy


The future of journalism is not all doom and gloom. Here’s why was originally published in Global Editors Network on Medium, where people are continuing the conversation by highlighting and responding to this story.

Does your machine mind? Ethics and potential bias in the law of algorithms

Organised by: The Law Society of England and Wales

Can you teach a robot to love? Can you teach a robot what is just? Is it possible to ‘compute’ right from wrong?

Or are we so preoccupied with whether or not we can, that we didn’t stop to think if we should?

As advances in Artificial Intelligence bring new computing capabilities off the pages of science fiction and into our daily lives, this seminar will examine what happens when the law, ethics and algorithms collide. Our panel of experts will consider three different perspectives on this exciting and emerging area.

First, the place of ethics in automated decision-making. How do you build appropriate ethics into algorithms, and what does this mean for transparency in decision-making? Do the gains really justify the costs?

Second, as AI systems become increasingly able to do tasks that only a human could previously, how do they fit with our existing ideas of rights? At what point do these rights start to resemble personhood? Will AI move from ‘computer’ to ‘colleague’?

Third, as we build these machines how do we keep them from reflecting or entrenching our own biases and inequalities?

Our panel will confront these questions, as well as questions from the audience, in a London Tech Week seminar that promises to push the electronic boundaries of our laws and morality.

Programme

17:30 – 18:00 Registration and refreshments

18:00 – 18:50 Speaker presentations

18:50 – 19:20 Question and answer session

19:20 – 20:30 Drinks and canapé reception with demonstrations of innovative tech products

Event details

Want to bring automation to your newsroom? A new AP report details best practices

In 2014, the Associated Press began automating some of its coverage of corporate earnings reports. Instead of having humans cover the basic finance stories, the AP, working with the firm Automated Insights, was able to use algorithms to speed up the process and free up human reporters to pursue more complex stories.

The AP estimates that the automated stories have freed up 20 percent of the time its journalists spent on earnings reports as well as allowed it to cover additional companies that it didn’t have the capacity to report on before. The newswire has since started automating some of its minor league baseball coverage, and it told me last year that it has plans to expand its usage of algorithms in the newsroom.

“Through automation, AP is providing customers with 12 times the corporate earnings stories as before (to over 3,700), including for a lot of very small companies that never received much attention,” Lisa Gibbs, AP’s global business editor, said in a report the AP released Wednesday.

The AP’s report — written by AP strategy and development manager Francesco Marconi and AP research fellow Alex Siegman, along with help from multiple AI systems — details some of the wire’s efforts toward automating its reporting while also sharing best practices and explaining the technology that’s involved, including machine learning, natural language processing, and more.

The report additionally identifies three particular areas of note that newsrooms should pay attention to as they consider introducing augmented journalism: unchecked algorithms, workflow disruption, and the widening gap in skills needed among human reporters to produce this type of reporting.

To highlight the challenges of using algorithmic journalism, the report constructed a situation where a team of reporters covering oil drilling and deforestation used AI to analyze satellite images to find areas impacted by drilling and deforestation:

Our hypothetical team begins by feeding their AI system a series of satellite images that they know represent deforestation via oil drilling, as well as a series of satellite images that they know do not represent deforestation via oil drilling. Using this training data, the machine should be able to view a novel satellite image and determine whether the land depicted is ultimately of any interest to the journalists.

The system reviews the training data and outputs a list of four locations the machine says are definitely representative of rapid deforestation caused by nearby drilling activity. But later, when the team actually visits each location in pursuit of the story, they find that the deforestation was not caused by drilling. In one case, there was a fire; in another, a timber company was responsible.

It appears that when reviewing the training data, the system taught itself to determine whether an area with rapid deforestation was near a mountainous area — because every image the journalists used as training data had mountains in the photos. Oil drilling wasn’t taken into consideration. Had the team known how their system was learning, they could have avoided such a mistake.
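This failure mode, a model latching onto a spurious correlation in the training data, can be sketched with a toy example. Everything below is illustrative and not from the AP report: each satellite image is reduced to two hypothetical binary features, and a deliberately simple one-rule learner picks whichever single feature best predicts the labels.

```python
# Toy illustration of the training-data bias described above.
# Features per "image": (has_mountains, has_drilling). Labels: 1 = deforestation
# of interest, 0 = not. Note every positive example also contains mountains.
train = [
    ((1, 1), 1), ((1, 1), 1), ((1, 0), 1), ((1, 0), 1),   # labelled "deforestation"
    ((0, 0), 0), ((0, 0), 0), ((0, 1), 0), ((0, 0), 0),   # labelled "no deforestation"
]

def learn_best_feature(data):
    """A one-rule learner: choose the single feature that best predicts the label."""
    n_features = len(data[0][0])
    best, best_acc = 0, 0.0
    for f in range(n_features):
        acc = sum(1 for x, y in data if x[f] == y) / len(data)
        if acc > best_acc:
            best, best_acc = f, acc
    return best

feature = learn_best_feature(train)
print("learned feature index:", feature)  # 0 = mountains, 1 = drilling

# A new image: a mountainous timber clear-cut with no drilling at all.
new_image = (1, 0)
print("flagged as drilling-related:", bool(new_image[feature]))
```

Because the mountains feature predicts the training labels perfectly, the learner prefers it over the drilling feature and flags any mountainous clearing, exactly the false positive the hypothetical team went on to encounter in the field.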

Algorithms are created by humans, and journalists need to be aware of their biases and cognizant that they can make mistakes. “We need to treat numbers with the same kind of care that we would treat facts in a story,” said Dan Keyserling, head of communications at Jigsaw, the technology incubator within Google’s parent company Alphabet. “They need to be checked, they need to be qualified and their context needs to be understood.”

That means the automation systems need maintenance and upkeep, which could change the workflow and processes of editors within the newsroom:

Story templates were built for the automated output by experienced AP editors. Special data feeds were designed by a third-party provider to feed the templates. Continuing maintenance is required on these components as basic company information changes quarter to quarter, and although the stories are generated and sent directly out on the AP wires without human intervention, the journalists have to watch for any errors and correct them.
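As a rough sketch of what template-driven generation like this involves (the template wording, field names and numbers below are invented for illustration, not the AP's actual templates or data feed):

```python
# Minimal sketch of filling a story template from one row of a hypothetical
# earnings data feed. "$$" in the template escapes to a literal dollar sign.
from string import Template

EARNINGS_TEMPLATE = Template(
    "$company reported quarterly earnings of $$${eps} per share on revenue of "
    "$$${revenue} million, ${direction} analyst expectations of $$${forecast} per share."
)

def render_story(record):
    """Turn one data-feed record into story text via the template."""
    direction = "beating" if record["eps"] > record["forecast"] else "missing"
    return EARNINGS_TEMPLATE.substitute(
        company=record["company"],
        eps=f"{record['eps']:.2f}",
        revenue=f"{record['revenue']:,}",
        forecast=f"{record['forecast']:.2f}",
        direction=direction,
    )

feed_row = {"company": "Example Corp", "eps": 1.12, "revenue": 5400, "forecast": 1.05}
print(render_story(feed_row))
```

Even in this toy version, the maintenance burden the report mentions is visible: if the data feed renames a field or changes units, the template logic has to be updated, and editors still need to spot-check the output.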

Automation also changes the type of work journalists do. For instance, when it comes to the AP’s corporate earnings stories, Gibbs, the global business editor, explained that reporters are now pursuing different types of reporting.

“With the freed-up time, AP journalists are able to engage with more user-generated content, develop multimedia reports, pursue investigative work and focus on more complex stories,” Gibbs said.

Still, in order to use this type of automated reporting, newsrooms must employ data scientists, technologists, and others who are able to implement and maintain the algorithms. “We’ve put a lot of effort into putting more journalists who have programming skills in the newsrooms,” said New York Times chief technical officer Nick Rockwell.

The report emphasizes that communication and collaboration are critical, especially while keeping a news organization’s journalistic mission front and center. The report outlined how it views the role data scientists play:

Data scientists are individuals with the technical capabilities to implement the artificial intelligence systems necessary to augment journalism. They are principally scientists, but they have an understanding as to what makes a good story and what makes good journalism, and they know how to communicate well with journalists.

“It’s important to bring science into newsrooms because the standards of good science — transparency and reproducibility — fit right at home in journalism,” said Larry Fenn, a trained mathematician working as a journalist in AP’s data team.

The full AP study is available here.

My News Feed is Filtered? Awareness of news personalization among college students


Digital Journalism, Vol. 0, Iss. 0

Personalization algorithms, widely used by digital media sources, filter and prioritize news in ways that may not be apparent to users. Savvy media consumers should be aware of how this technology is used to tailor news to their tastes. This two-part study examines the extent to which US college students are aware of news personalization, and the actions and criteria that affect news selection and prioritization. Interviews with one set of students (N = 37) focus on the news sources they use most often to begin a news search. A subsequent survey given to a second set of students (N = 147) focuses on Google and Facebook, two influential gatekeepers. Results show that students are largely unaware of whether and how news sources track user data and apply editorial judgments to deliver personalized results. These studies identify aspects of news personalization that warrant greater attention in college curricula.
