Louis Barclay
Study: Hide Sticky Features

Imagine a world where infinite feeds and toxic recommendations are hidden. Where the ‘sticky’ features that drive the most engagement on big tech platforms, and therefore the most harm, are opt-in instead of being forced on users. What does that world look like? How does it reduce online harms like misinformation and addiction? What happens when we place big tech’s cigarettes behind the counter, instead of within easy reach on the shelves? We don’t have to imagine this world: we can build it, and test it. These features are simply bits of HTML. We can hide them, and study what happens when we do. Then we can tell legislators about this world, and work with them to make it a reality.
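To make ‘bits of HTML’ concrete, here is a minimal sketch, written as a TypeScript browser-extension content script, of what hiding one such feature looks like. The selector is illustrative rather than exact, since platforms change their markup frequently:

```ts
// Content-script sketch: hide a sticky feature by removing it from the rendered page.
// The selector is illustrative only; platforms rename and restructure their markup often.
const STICKY_FEATURE_SELECTOR = "#related"; // hypothetical: YouTube's recommended-videos sidebar

function hideStickyFeature(): void {
  document
    .querySelectorAll<HTMLElement>(STICKY_FEATURE_SELECTOR)
    .forEach((el) => {
      el.style.display = "none";
    });
}

// Pages like YouTube render asynchronously, so re-apply the rule whenever the DOM changes.
new MutationObserver(hideStickyFeature).observe(document.documentElement, {
  childList: true,
  subtree: true,
});
hideStickyFeature();
```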

Summary

  • I’m seeking funding to study the effects of sticky features used by tech platforms, like infinite feeds and recommended content, by running a continuous experiment in which randomly selected users have these features hidden.
  • Specifically, I’ll be studying the effects of these sticky features on time spent, subjective wellbeing, and exposure to mis/disinformation.
  • The intention is that this research will lead to policy proposals for regulation that addresses sticky features in a targeted way.

Context

  • Sticky features designed to keep users hooked, like recommended content and infinite feeds, are responsible for many of the greatest harms of tech platforms, such as misinformation and addiction.
  • However, there is not enough research at the level of specific sticky features. We tend to talk about Facebook as a whole being bad, not about the News Feed specifically being bad. We do talk about YouTube recommendations being bad, yet we don’t have studies comparing randomized groups of YouTube users who do and don’t see recommendations.
  • We should be imagining a world where we:
    • Hide these features by default, so that users must opt in to see them.
    • Study them in isolation to understand their harms. We need to split them out so we can talk authoritatively about their effects.
    • Test the effects on users of these features not being there at all. It’s important to talk about tweaking news feed algorithms to make them safer, but what if, by default, the news feed didn’t show at all? Good research on sticky features must start with randomized controlled trials, with the experiment group subject to an intervention that hides or alters the sticky feature (a sketch of how participants could be assigned to such groups follows this list).
  • This split-out research will help us work towards recommending targeted regulation to make these features safer, such as feeds or recommended content having to be hidden by default, adequately labelled, or altered in some way to be less harmful. Split-out research for each sticky feature will lead to more concrete policy recommendations.
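As a rough illustration of what randomized assignment could look like inside an extension like Nudge (the names here are hypothetical, not an existing implementation), each consenting participant can be deterministically assigned to a ‘hidden’ or ‘shown’ arm for each feature, so the same person always gets the same experience:

```ts
// Sketch: deterministic, roughly 50/50 assignment of a participant to an experiment arm.
// participantId is assumed to be an anonymous ID generated once at install time (hypothetical).
async function assignArm(
  participantId: string,
  featureName: string
): Promise<"hidden" | "shown"> {
  const input = new TextEncoder().encode(`${participantId}:${featureName}`);
  const digest = await crypto.subtle.digest("SHA-256", input);
  const firstByte = new Uint8Array(digest)[0];
  return firstByte < 128 ? "hidden" : "shown";
}

// Usage: hide the feature only for participants assigned to the "hidden" arm.
// if ((await assignArm(participantId, "youtube-recommendations")) === "hidden") hideStickyFeature();
```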

Proposal

  • I am looking for funding for one year to:
    • Turn Nudge into a continuous research tool that A/B tests what happens when key sticky features are removed, or hidden by default.
      • Users will be randomly shown, or not shown, key sticky features such as YouTube recommendations or the Facebook News Feed.
      • Users’ time spent browsing will be collected with their consent, and users will report subjective wellbeing and exposure to mis/disinformation (a sketch of the time-spent measurement follows this list).
      • Users will be participating in a public interest experiment to evaluate the harms of sticky features, and by extension the harms of the AI that drives them.
  • Funding will also go towards building up a community of developers and volunteers to contribute to Nudge’s database of sticky features.
    • Nudge already has 139 stars on GitHub (i.e. developers following the project), and a number of developers have previously expressed interest in contributing.
  • I have previously conducted a similar study with the University of Neuchâtel, Switzerland, focusing specifically on the Facebook News Feed; it is nearing completion and should be published later this year.
  • This time I intend to open source the data I gather so that it can be used freely by any academic institution.
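As a rough sketch of how the consented time-spent measurement could work (the reporting endpoint and field names below are placeholders, not an existing Nudge API), a content script would count only the time when the tab is actually visible and report the total, tagged with the participant’s experiment arm, when the page is left:

```ts
// Sketch: count active (tab-visible) time on a platform and report it on page exit.
// The reporting URL and payload fields are placeholders; real data collection would
// happen only with the participant's explicit consent.
let visibleSince: number | null =
  document.visibilityState === "visible" ? Date.now() : null;
let activeMs = 0;

document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "visible") {
    visibleSince = Date.now();
  } else if (visibleSince !== null) {
    activeMs += Date.now() - visibleSince;
    visibleSince = null;
  }
});

window.addEventListener("pagehide", () => {
  if (visibleSince !== null) activeMs += Date.now() - visibleSince;
  // sendBeacon is designed to deliver small payloads reliably while a page is being unloaded.
  navigator.sendBeacon(
    "https://example.org/nudge-study/report", // placeholder endpoint
    JSON.stringify({ site: location.hostname, activeMs, arm: "hidden" }) // arm from assignArm above
  );
});
```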

About me

  • 6 years developing free software to counter the harms of tech platforms.
  • Founder of several tech startups, including Stacker (which went on to be funded by a16z) and Cloakist (which I exited in March 2022).

Note

  • If it will increase the impact and reach of this research, I am open to conducting it under a different brand, as part of an existing organisation, or under an existing tool such as Mozilla Rally.