Why Social Media’s Echo Chambers Won’t Happen on AI-Powered News Sites

Social media has a powerful pull on people: we stay in touch and feel connected to family, friends and communities that are important to us. But when it comes to news, social media's hyper-personalized algorithms have revealed their dangerous side: echo chambers, information bubbles, and the rapid spread of fake news. The very algorithms that made social media so effective at keeping people engaged also created these problems.

The way to fight rampant misinformation and improve online discourse is the very thing journalists and editors work hard to achieve: get people to spend more time with high-quality news sites, but without the "engagement at all costs" approach of social media.

By marrying a version of the personalization technology that makes social media feel so relevant with the skills, craftsmanship and values of journalism, news sites can re-capture the kind of engagement with their content that improves discourse and supports civic life in our communities.

The truth about these algorithms, a truth that big tech has been reluctant to own, is that those algorithms are not agnostic. They reflect the goals, experiences and values of those who create them. News publishers have an opportunity to develop tools that more explicitly reflect the goals and values that have historically helped guide decisions about news.

An example of the kind of values journalists aspire to was made explicit nearly 90 years ago by one publisher. In 1933, when Eugene Meyer bought the Washington Post out of bankruptcy, he established a set of principles he hoped his newspaper would achieve: "To tell the truth, as nearly as the truth may be ascertained;" that the "newspaper's duty is to its readers and the public at large;" and that "in the pursuit of the truth, the newspaper shall be prepared to make sacrifices of its material fortunes, if such course be necessary for the public good."

The way social media developed its algorithms reflects no such values. Instead, social media algorithms are tuned only to increase the time users spend on the site or app, without regard to the truth, accuracy, or impact of the content on those readers.

On a typical day, 4.75 billion pieces of content are shared on Facebook. There is very little vetting of this ocean of content, so the onus of judging what you read on Facebook falls entirely on you. While the share from your aunt about your cousin's birthday party is quite trustworthy, that content your uncle shared about an election, a vaccine or a "special military operation" is harder to judge.

Not surprisingly, according to a recent Gallup poll, social media is the least trusted source of news about the world. Paradoxically, social media is also the most relied upon source for the news.

News publishers have some inherent disadvantages in capturing the same level of engagement that sites like Facebook have captured, especially when it comes to technology development. But those same publishers' greatest asset, their newsrooms and editorial operations, is exactly what our world needs today. Notably, while these publishers are not the most relied upon for news, they are the most trusted. Building that engagement is where smart application of AI and personalization comes in.

For AI to effectively personalize news sites and increase reader engagement, it’s crucial that editors be involved at every step of the process of designing the algorithms and UX.

With "Homepage for You," our newly released homepage personalization tool, we did just that. Our product, engineering and data science teams worked with senior editors at some of the world's biggest news sites, incorporating their expertise and cautionary guidance into the very foundation of the technology. We knew that earning editors' trust would be the first step to earning readers' trust and, by extension, readers' engagement.

Some of the things that came out of this effort include:

  • Tools that let editors review and moderate algorithmic decisions, "teaching" the algorithm about editorial judgment.
  • An interface to disable personalization in specific parts of the homepage (any or all) when the news cycle warrants more closely curating particular homepage content. The breaking news cycle over Ukraine is an easy example of when these editorial controls are essential. Other examples could include times when major news projects are published, when local natural disasters strike, or other areas of editorial focus, like climate change.
  • Editorial teams can create "definitions" of the content mix so that personalization never removes or hides key content areas, regardless of a reader's measured interest in them. For example, coverage of a local sports team would still appear on a local site, even for readers with only a slight affinity for sports content.
  • Focus on originally-reported content over aggregated and syndicated sources. This is in line with data showing that readers trust their chosen news source over social media and other news aggregators.
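
To make the "content mix" idea above concrete, here is a minimal, purely illustrative sketch of how editorially defined minimums could constrain a personalized ranker. The function name, data shapes, and scores are all hypothetical assumptions for illustration; this is not Taboola's actual implementation.

```python
def rank_homepage(articles, affinity, min_slots, total_slots=10):
    """Rank articles by a reader's affinity scores, while guaranteeing
    each editorially defined section (e.g. local sports) keeps its
    minimum number of homepage slots."""
    # Sort all candidates by the reader's personalized affinity score.
    ranked = sorted(articles, key=lambda a: affinity.get(a["id"], 0.0), reverse=True)

    chosen = []
    counts = {section: 0 for section in min_slots}

    # First pass: satisfy the editorial minimum for each section,
    # taking the highest-affinity articles within that section.
    for section, minimum in min_slots.items():
        for article in ranked:
            if article["section"] == section and article not in chosen and counts[section] < minimum:
                chosen.append(article)
                counts[section] += 1

    # Second pass: fill the remaining slots purely by affinity.
    for article in ranked:
        if len(chosen) >= total_slots:
            break
        if article not in chosen:
            chosen.append(article)
    return chosen[:total_slots]
```

In this sketch, even a reader whose affinity scores rank sports last still sees the editorially mandated sports coverage; personalization only reorders what fills the remaining slots.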

The news industry continues to face significant challenges competing with social platforms, but many publishers are ready to tackle those challenges. Recent research from Reuters shows that 85% of publishers globally say that AI will be important this year in delivering better personalization and content recommendations for consumers.

In an age where personalization is the new norm for digital experiences, it can be hard for news sites to make sure they're capturing user attention in a safe way. But with effort and discernment, strong personalization strategies can help readers discover new and interesting things about their community and the world.

Taboola's Homepage For You is one example of a solution that balances this new personalization status quo with the need for an accurate, trustworthy exchange of information. If you have any questions, reach out to us here.

Get Your Taboola Feed Today!