Product Highlights

Social media’s history is not yet written, and its effects are not neutral. It is tied up in the richness and complexity of social life. As its builders we must endeavor to understand its impact — all the good, and all the bad — and take up the daily work of bending it towards the positive, and towards the good. This is our greatest responsibility.
— Chris Cox, 2019

In 2016, Facebook revealed its three-pronged approach to mitigating harm on News Feed: Remove, Reduce, and Inform. During my time on Facebook App Integrity, I primarily focused on research that shaped our approach to the latter two prongs, “Reduce” and “Inform.” I played a critical role in expanding our Integrity UI footprint, reducing bad user experiences, and increasing user understanding of the content they consume and share.

Adding Context

During my time at Facebook, I served as research lead and research manager for our Facebook Integrity UI products. As such, I developed a robust research agenda and conducted multiple studies that ultimately informed a suite of products, with informative friction at the center.

(Excerpted from previous writings) Drawing from multiple internal research studies, both qualitative and quantitative, we identified three central people problems that at first glance may appear to be in conflict:

  1. Users want social media companies to do more about integrity harms.

  2. Users want to feel more empowered and more in control of their social media experience.

  3. Users want the opportunity to assess information for themselves, but they also want the relevant information to do so.

Further, extensive qualitative work, particularly in low-literacy communities, made us aware that some users faced challenges assessing information and identifying low-quality content. As social media platforms were made simpler and sleeker, these challenges only grew, creating a need to communicate more directly and obviously about potential harm. From this, we determined that a simple front-end intervention, one that empowered users and gave them insight into how and what we considered harmful, would deliver high user value. As such, we launched our “out-of-date news” interstitial in June 2020.

Over time, we expanded these interstitials to cover additional potential triggers.

We also expanded these interventions to new surfaces, such as providing context when users attempted to join a group that had repeatedly violated the Community Standards (Announcement | News Coverage), continuing to empower users to make more informed choices about the content they consume and engage with on social media.


Ensuring Safe Elections

During my time at Meta, I repeatedly leveraged my subject matter expertise in American politics, electoral systems, and political behavior to help shape and support election work around the globe.


Reducing Bad Experiences

Core to my time at Meta was working to better understand user harms and bad experiences that live outside of and adjacent to the Facebook Community Standards. In doing so, my team and I sought to deeply understand user-perceived harms and bad experiences (ranging from clickbait and low information quality to gross content and potential nudity) and to introduce appropriate interventions, whether at the user or community level. This work resulted in the introduction of more personalized ranking, as well as new tools like the X-out control to hide unwanted content.

  • Read more on the Content Distribution Guidelines, which describe FB’s approach to demotions, here.

  • Read more about how we address more personal bad experiences through personalization here.

  • Read more about our enforcement and measurement approach here.


Expanding Controls

In 2021, Nick Clegg (President of Global Affairs, Meta) wrote an op-ed describing how algorithms and recommender systems are a two-way street: a conversation between the user and Facebook. While this had long been true, he, and we, were well aware that it often did not feel that way to users. We worked to address this disparity by building additional, often lighter-weight, user content controls. In building these controls, we sought to increase transparency and user understanding of our systems, empower users, and reduce their bad experiences on FB. Here, I led the research team in more deeply understanding the “sense of control” people problem, reimagining existing controls, discovering new product solutions, and supporting the end-to-end product development cycle of multiple products, some seen above.

  • Read more about some of those changes here.