New Approaches to Designing the News

Service design starts from the role of the user and the experience they seek across their journey. Policymakers should look to service design to innovate policy and address social issues.

How can we use design to create and distribute news stories that are less biased, more truthful and still readable?

Social media plays a pivotal role in shaping people’s education and awareness of most global and national events. When it’s used to showcase news, it’s a good thing, but when fake news dominates social media streams, misinformation can become dangerous.

We explore three possible design solutions to the Fake News issue.

Facebook — and other social media platforms — may insist that they are, at heart, tech companies and not publishers, but to ignore Facebook’s capacity to inform people both online and offline would be to misunderstand its power in today’s media. There needs to be a better way to share and curate the news feed.

Traditionally we have relied on publishers to perform fact-checking so we can trust sources of information without having to verify them ourselves. Mastheads such as the New York Times and the Wall Street Journal have their own editorial agendas, but we generally take it as a given that the information presented in these publications is fact-checked, and we call journalists and editors out when they are not thorough.

Fact-checking becomes even more pertinent when the President-elect of the United States, Donald Trump, is prone to posting tweets that are later revealed to be false. His Twitter handle has over 15 million followers.

So we came up with some ideas for designing a better news feed:

Design Concept 1: Use Artificial Intelligence to Fact-check News Stories

News stories have traditionally been validated by an editorial team, but in the digital age fact-checkers matter more than ever, especially with the rise of fake and amateur news sites where stories can go viral while remaining unsubstantiated. Artificial intelligence could prove handy here: it can perform routine validations and checks on stories based on their authors, the websites they appear on, or whether the outlet is certified, and it requires less oversight by people.

In October 2016, Google added a “fact check” tag to its popular Google News service, a sign that the company recognised the need for a service that helps verify the authenticity of a news story and its source. While the tag doesn’t eliminate fake news from the feed altogether, it does add a layer of scrutiny and allows users to make up their own minds about a story and its source.

We suggest that Facebook and other publishers take a similar approach: a simple tool, attached to the interface of WordPress (or even Medium), that uses natural language processing to pull out phrases that may warrant checking. You need to start somewhere, so why not let these well-resourced and ever-expanding teams do some of the fact-checking work for us? How many people do you need designing self-driving car algorithms anyway?
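As a rough illustration of where such a tool could start, the sketch below uses simple heuristics (numbers, quotes, capitalised names) to pull out sentences that might warrant a fact-check. The heuristics, names and example text are our own assumptions; a real tool would use a proper natural language processing library rather than regular expressions.

```python
import re

def extract_check_candidates(text):
    """Pull out sentences that look like checkable claims: ones containing
    numbers, quoted statements or capitalised proper nouns.
    Heuristic only; a production tool would use a full NLP pipeline."""
    sentences = re.split(r'(?<=[.!?])\s+', text)
    candidates = []
    for sentence in sentences:
        has_number = bool(re.search(r'\b\d[\d,.%]*\b', sentence))
        has_quote = '"' in sentence or '\u201c' in sentence
        proper_nouns = re.findall(r'\b[A-Z][a-z]+(?:\s+[A-Z][a-z]+)+\b', sentence)
        if has_number or has_quote or proper_nouns:
            candidates.append({
                "sentence": sentence.strip(),
                "entities": proper_nouns,
                "reasons": [label for label, hit in [("number", has_number),
                                                     ("quote", has_quote),
                                                     ("named entity", bool(proper_nouns))] if hit],
            })
    return candidates

# Hypothetical article text, purely for illustration.
article = 'The senator claimed that "unemployment fell 40%" in Western Australia last year.'
for candidate in extract_check_candidates(article):
    print(candidate["sentence"], "->", candidate["reasons"])
```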

Design Concept 2: Use Algorithms and Editors Concurrently

Facebook has traditionally relied on an algorithm to generate its list of Trending news items. This algorithm is an automated process that ranks articles by “newsworthiness”, meaning popularity in terms of views, clicks and shares. It’s a pretty powerful tool considering it shapes the reading lists of millions.
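Facebook hasn’t published how its Trending algorithm works, but the idea described above, ranking purely on engagement, can be sketched in a few lines. The weights and field names below are illustrative assumptions, not the real formula.

```python
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    views: int
    clicks: int
    shares: int

def newsworthiness(article, w_views=1.0, w_clicks=2.0, w_shares=3.0):
    """Toy popularity score: a weighted sum of engagement counts.
    The weights are invented; the real ranking is proprietary."""
    return (w_views * article.views
            + w_clicks * article.clicks
            + w_shares * article.shares)

def trending(articles, top_n=10):
    """Return the most 'newsworthy' articles by popularity alone --
    exactly the behaviour this concept questions."""
    return sorted(articles, key=newsworthiness, reverse=True)[:top_n]
```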

Until recently, Facebook had an editorial team who wrote the descriptions for its trending stories. Questions arose as to whether some news stories were being added manually by editors, and whether the descriptions written for certain items were politically biased. In response, Facebook retired its editorial team altogether.

Is an algorithm the right way to edit and bring news to the forefront? Is “newsworthiness” merely whichever articles attract the most shares and clicks? At this stage, who can deliver what users want in terms of truthfulness? It seems the best scenario is one where a balance is struck: an algorithm is helpful because it can perform a mass sweep of popular articles, while an editorial team provides oversight and could also act as a fact-checking unit.

Facebook is turning to outside groups for help with fact-checking. It is also exploring a product that would label stories as false if they have been flagged as such by third parties or users, and then show warnings to people who read or share those articles, similar to Google’s approach.
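Facebook hasn’t said exactly how such a product would work, but one plausible shape for it is sketched below: verdicts from partnered fact-checkers are trusted outright, while user flags only trigger a warning once they pass a threshold. The threshold, field names and wording are assumptions for illustration only.

```python
def warning_label(third_party_flags, user_flags, user_flag_threshold=50):
    """Decide whether a story should carry a 'disputed' warning.
    Any 'false' verdict from a partnered fact-checker triggers the label;
    user reports only count once they pass a threshold.
    Thresholds and field names are assumptions, not Facebook's rules."""
    if any(flag["verdict"] == "false" for flag in third_party_flags):
        return "Disputed by third-party fact-checkers"
    if user_flags >= user_flag_threshold:
        return "Flagged as potentially false by readers"
    return None

# Example: one partnered checker has rated the story false.
label = warning_label(
    third_party_flags=[{"source": "Partnered fact-checker", "verdict": "false"}],
    user_flags=12,
)
print(label)  # -> "Disputed by third-party fact-checkers"
```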

Design Concept 3: Create a Metric to Measure Bias

Facebook has long relied on users to flag objectionable content, including fake news. However, relying on people to classify what is and isn’t misinformation isn’t always trustworthy: after all, there’s no uniform standard for ‘truth’.

However, company oversight isn’t always the best means of addressing discontent either. Facebook often uses its ambiguous “community standards” to determine whether something gets removed, and the results are inconsistent.

For design to work properly, the process needs to serve the user. A better solution would be to introduce a metric that lets users rate and assess news they believe to be biased, sitting next to the “Like” and “Comment” buttons. At the moment, the only option users have is to “report” content, which is too blunt an instrument for this kind of interaction. By adding their own assessment into the mix, users can flag news to friends or communities as potentially prejudiced, so readers take it with a grain of salt and make up their own minds.
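To make the idea concrete, here is a minimal sketch of how such a bias metric could be aggregated, assuming users submit a simple 1-to-5 rating. The scale, threshold and labels are illustrative assumptions, not a specification.

```python
from statistics import mean

def bias_summary(ratings, min_ratings=20):
    """Aggregate per-user bias ratings (1 = balanced, 5 = heavily biased)
    into a label shown alongside Like and Comment. The scale, threshold
    and wording are illustrative assumptions."""
    if len(ratings) < min_ratings:
        return "Not enough ratings yet"
    score = mean(ratings)
    if score >= 4:
        return f"Readers rate this story as strongly biased ({score:.1f}/5)"
    if score >= 2.5:
        return f"Readers rate this story as somewhat biased ({score:.1f}/5)"
    return f"Readers rate this story as broadly balanced ({score:.1f}/5)"

print(bias_summary([4, 5, 4, 3, 5] * 5))  # 25 ratings, mean 4.2
```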

Social media is a service designed to reach millions, but its purpose isn’t to create content; it’s to share it.

By attempting to take on the role of an editor, publisher and communication channel, Facebook ends up eroding the distinction between what the user wants (trustworthy news and forms of communication) and what the user is getting (misinformation affecting online and offline communication).

We believe that technology needs to be the tool, not the creator of content itself. The power still rests with the user.

Portable has been at the leading edge of innovation in design, user experience, service design and delivery for over a decade. We use Design Thinking Principles to develop and produce products that are viewed and used by millions.
