Facebook has fomented extremism for years. The phenomenon was well known within the company, yet its executives refused to stop it.
Our brains are drawn to divisive content, such as clear-cut, polarizing opinions. Facebook's algorithms exploit precisely this weakness: divisive content generates more activity among users, and therefore more revenue for the company.
Recommendation algorithms are convenient because they generally present us with content we like or already agree with. But they risk creating a filter bubble: a small "personalized" universe that shields us from other ideas, opinions, and realities. Homogeneous communities can then grow in isolation, fertile ground for the emergence of extremist groups.
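To make that feedback loop concrete, here is a minimal toy sketch in Python, not Facebook's actual system. It assumes a hypothetical setup: items sit on a one-dimensional "opinion" axis, the recommender simply suggests the items closest to the average of a user's past likes, and the user engages with one suggestion per round. All names and parameters are illustrative.

```python
import random

random.seed(0)

# Hypothetical catalog: 101 items spread evenly across an opinion axis [-1, 1].
catalog = [i / 50.0 - 1.0 for i in range(101)]

# The user starts out with fairly diverse tastes.
history = random.sample(catalog, 5)

def recommend(history, catalog, k=5):
    """Suggest the k items closest to the user's average liked position."""
    center = sum(history) / len(history)
    return sorted(catalog, key=lambda item: abs(item - center))[:k]

for step in range(1, 21):
    options = recommend(history, catalog)
    liked = random.choice(options)  # the user engages with one suggestion
    history.append(liked)
    if step % 5 == 0:
        recent = history[-5:]
        # How wide a slice of the opinion axis has the user seen lately?
        print(f"round {step:2d}: spread of last 5 likes = "
              f"{max(recent) - min(recent):.2f}")
```

Run it and the spread of recently liked items collapses from a broad swath of the spectrum to a narrow slice: in this simplified model, similarity-based personalization alone, with nothing pushing the user outward, is enough to seal the bubble.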
Personalization is at the core of Facebook's business, but it comes at the cost of social division, political polarization, and the rise of extremism.