In 2021 the Facebook whistleblower Frances Haugen provided internal documents to the Wall Street Journal showing that Facebook's own research found Instagram made body image issues worse for one in three teen girls. The company knew. Its algorithm was optimizing for engagement, and engagement correlated with content that made teenagers feel worse about themselves. It did not change course for years.
This is not hypothetical. It is documented. And it happened precisely because the ranking algorithm was opaque: users had no idea why they were seeing what they saw, regulators had no idea what to measure, and researchers had no way to audit it.
I am not arguing that platforms must publish their source code. I am arguing that they must disclose, in language a regulator can evaluate and a researcher can study, how their ranking systems work: what signals they optimize for, what content categories they amplify or suppress, whether political content is treated differently from other content, and whether there are differential effects by demographic group.
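To make that demand concrete, here is a minimal sketch of what a machine-readable version of such a disclosure might look like. Everything in it is hypothetical: the schema, the field names, the `AlgorithmDisclosure` type, and all the numbers are invented for illustration, not drawn from any existing regulation or platform.

```typescript
// Hypothetical disclosure schema. Nothing here is a real regulatory
// standard; the types and values are invented to illustrate what a
// regulator-readable, researcher-comparable filing could contain.

// A signal the ranking system optimizes for, with its relative weight.
interface OptimizationSignal {
  name: string;        // e.g. "predicted_dwell_time"
  weight: number;      // relative contribution to the ranking score
  description: string; // plain-language explanation
}

// How a category of content is treated relative to the baseline.
interface ContentTreatment {
  category: string;          // e.g. "political", "health"
  multiplier: number;        // >1 amplified, <1 suppressed
  differsFromBaseline: boolean;
}

// Measured outcome differences for a demographic group.
interface DemographicEffect {
  group: string;             // e.g. "users aged 13-17"
  metric: string;            // what was measured
  relativeToAverage: number; // e.g. 1.3 = 30% above platform average
}

interface AlgorithmDisclosure {
  platform: string;
  reportingPeriod: { start: string; end: string }; // ISO 8601 dates
  optimizationSignals: OptimizationSignal[];
  contentTreatments: ContentTreatment[];
  demographicEffects: DemographicEffect[];
}

// An illustrative filing. All figures are fabricated placeholders
// showing the shape of the data, not real measurements.
const exampleFiling: AlgorithmDisclosure = {
  platform: "ExamplePlatform",
  reportingPeriod: { start: "2024-01-01", end: "2024-06-30" },
  optimizationSignals: [
    {
      name: "predicted_engagement",
      weight: 0.7,
      description: "Likelihood the user interacts with the post",
    },
    {
      name: "predicted_dwell_time",
      weight: 0.3,
      description: "Expected seconds spent viewing the post",
    },
  ],
  contentTreatments: [
    { category: "political", multiplier: 0.8, differsFromBaseline: true },
  ],
  demographicEffects: [
    {
      group: "users aged 13-17",
      metric: "negative appearance comparison rate",
      relativeToAverage: 1.3,
    },
  ],
};

console.log(JSON.stringify(exampleFiling, null, 2));
```

The point is not this particular schema. The point is that all four disclosure categories above can be expressed precisely enough for a regulator to evaluate and for a researcher to compare across platforms, without a single line of proprietary source code changing hands.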
This is the same principle we apply to financial risk models under Dodd-Frank, to clinical trial data under FDA rules, and to factory emissions under the Clean Air Act. We require disclosure of consequential information because the public interest in oversight outweighs the proprietary interest in secrecy. Social media ranking algorithms are among the most consequential systems shaping public discourse in the world. The case for opacity is the same case tobacco companies made for hiding their internal research: we do not want to be regulated.