How Facebook got addicted to spreading misinformation
Karen Hao, MIT Technology Review:
In 2017, Chris Cox, Facebook’s longtime chief product officer, formed a new task force to understand whether maximizing user engagement on Facebook was contributing to political polarization. It found that there was indeed a correlation, and that reducing polarization would mean taking a hit on engagement. In a mid-2018 document reviewed by the Journal, the task force proposed several potential fixes, such as tweaking the recommendation algorithms to suggest a more diverse range of groups for people to join. But it acknowledged that some of the ideas were “antigrowth.” Most of the proposals didn’t move forward, and the task force disbanded.
The metrics a business chooses to measure its success by drive its decisions and define its product. You can use all the marketing words in the world to frame your work in a different light, but in the end, it’s always about what you measure.
If you’re optimising user engagement to make people watch episode after episode of a TV series, then you’re selling TV shows. If your product automates brokering loans to small businesses, you’re in finance. If you’re building a product that optimises engagement so that users see and click more ads, then you’re in advertising. You’re not building communities or bringing the world closer together. And you certainly aren’t making the world a better place.