Jeremy Singer-Vine is the data editor at BuzzFeed News. He also publishes Data Is Plural, a weekly newsletter of useful/curious datasets. Contact information is available at jsvine.com.
November 18, 2020 04:00 PM PT
Digital Forensics Pioneer, UC Berkeley
We have all been the subject of predictive computer algorithms of the form: "If you liked X, then you may like Y." Music and movie streaming services and online shopping sites routinely analyze our previous listening, viewing, and shopping habits, compare them with those of other users, and then make recommendations for us. Many of us have also been subjected to a predictive algorithm of the form "If you are like X, then we may not give you a loan or a job."
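The "if you liked X, you may like Y" pattern described above can be illustrated with a toy version of user-based collaborative filtering. This is a hypothetical sketch, not any system discussed in the talk: the function, the scoring rule (count of shared liked items), and the data are all invented for illustration.

```python
# A minimal sketch of "if you liked X, you may like Y" recommendation.
# All user names, items, and the similarity rule are invented.

def recommend(target, ratings, k=1):
    """Suggest items the target user hasn't liked yet, weighted by
    how much overlap each other user has with the target's tastes."""
    target_items = ratings[target]
    scores = {}
    for user, items in ratings.items():
        if user == target:
            continue
        # Similarity = number of liked items shared with the target.
        overlap = len(target_items & items)
        if overlap == 0:
            continue
        # Candidate items are those the similar user liked
        # but the target hasn't seen.
        for item in items - target_items:
            scores[item] = scores.get(item, 0) + overlap
    return sorted(scores, key=scores.get, reverse=True)[:k]

ratings = {
    "ana":  {"jazz", "blues", "soul"},
    "ben":  {"jazz", "blues", "funk"},
    "cruz": {"metal", "punk"},
}
print(recommend("ana", ratings))  # → ['funk']
```

Here "ana" shares two liked genres with "ben" and none with "cruz", so ben's remaining pick, funk, is recommended. Real systems use far richer similarity measures, but the underlying "people like you liked this" logic is the same one the abstract describes for lending, hiring, and sentencing predictions.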
Banks and employers routinely make lending and hiring decisions based on comparing your personal attributes with those of others. And, if you've recently had a run-in with the criminal justice system, you may have been subjected to a predictive algorithm of the form "If you are like X, then you may go to jail."
These predictive algorithms, trained on historical data, have been accused of reflecting and amplifying past racial and gender injustices rather than removing them, as intended. We evaluate the claim that predictive computer algorithms are more accurate and fair than people tasked with making similar decisions. We also evaluate, and explain, the presence of racial bias in predictive algorithms used in the criminal justice system.
Data Editor, BuzzFeed News
BuzzFeed News data editor Jeremy Singer-Vine will describe the data efforts that underpinned the FinCEN Files — the recent investigative collaboration between BuzzFeed News, the International Consortium of Investigative Journalists, and more than 100 partner news organizations around the world. Using the FinCEN Files as a point of reference, he'll also discuss several ways in which the goals and techniques of data-for-journalism differ from data work in other fields.