Filter Bubbles and Big Nudging: Impact on Data Privacy and Civil Society

The emergence of profiling (or behavioural tracking), filter bubbles and big nudging in the Internet age not only infringes people’s privacy, but also prejudices their free participation in civil society.

A filter bubble is a state of intellectual isolation resulting from the supply of personalised content by websites to netizens. The personalised content is generated by profiling, or behavioural tracking, of netizens through algorithms applied to their personal data. As a result, netizens may see only the information they are fed, the information they believe they want to see, or information that suits their interests, preferences or viewpoints. They consequently get little exposure to differing viewpoints and become intellectually isolated in their own information bubbles.
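The filtering mechanism just described can be illustrated with a deliberately simplified sketch (the data, topics and scoring rule are all hypothetical, and no real platform's algorithm is this crude): articles are ranked by their overlap with a user's tracked interest profile, so content sharing nothing with the profile sinks out of view.

```python
def personalised_feed(articles, interest_profile, top_n=3):
    """Rank articles by how many of the user's tracked interests they match."""
    def score(article):
        return len(set(article["topics"]) & interest_profile)
    # Articles sharing no topics with the profile score 0 and drop off the feed.
    return sorted(articles, key=score, reverse=True)[:top_n]

# Hypothetical catalogue of articles, each tagged with topics by the platform.
articles = [
    {"title": "Tax cuts work", "topics": {"economy", "right"}},
    {"title": "Raise the minimum wage", "topics": {"economy", "left"}},
    {"title": "New phone review", "topics": {"tech"}},
    {"title": "Privacy law explained", "topics": {"law", "tech"}},
]

# A user profiled as interested in "tech" and right-leaning content:
# the opposing economic viewpoint never reaches the top of the feed.
feed = personalised_feed(articles, {"tech", "right"})
```

Even this toy version shows the structural effect: the differing viewpoint is not censored, it is simply outranked, which is precisely how the isolation arises without the netizen noticing.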

Filter bubbles are more common, and their impact more far-reaching, nowadays than before because people increasingly rely on Internet search engines and online social media to hunt for information. Search engines and social media platforms also have strong commercial incentives to use personalised content to keep netizens on their platforms.

Filter bubbles can foster partial, biased or even radical views. More importantly, people affected by filter bubbles may come to believe that their views represent objective reality or the majority (and hence correct) view, while opposing views represent a mistaken minority. Filter bubbles are not conducive, and may indeed be harmful, to rational discussion in civil society. Some academic research has shown that filter bubbles may even increase ideological segregation and polarisation in a society (eg Flaxman, S., Goel, S. and Rao, J. M., ‘Filter Bubbles, Echo Chambers, and Online News Consumption’ (2016)).

Similar to filter bubbles, big nudging also involves the use of personal data. A nudge, in behavioural science, is a subtle external influence on an individual’s behaviour or decisions, such as positive reinforcement or suggestion. Big nudging is nudging carried out with the help of Big Data, very often supplemented by profiling or even filter bubbles. For example, in a political campaign, campaigners may use Big Data analytics to estimate which issues cause the greatest concern within each demographic group. They may then use profiling to ascertain the demographic group to which each targeted individual belongs and to evaluate his or her political views. Finally, the campaigners may deliver personalised propaganda to individuals, responding to their concerns and nudging them towards supporting certain propositions.
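The three campaign steps above can be sketched in miniature (everything here is hypothetical: the groups, the concerns, the profiling rule and the message template are invented for illustration, and real operations are far more sophisticated): analytics yields each group's top concern, profiling assigns an individual to a group, and a message is framed around the inferred concern.

```python
# Step 1 (assumed output of Big Data analytics): the dominant concern
# attributed to each demographic group.
TOP_CONCERN = {
    "young_urban": "housing costs",
    "retirees": "healthcare",
}

def profile_group(person):
    """Step 2: a deliberately crude profiling rule assigning a person
    to a demographic group from one piece of personal data."""
    return "retirees" if person["age"] >= 65 else "young_urban"

def nudge_message(person, proposition):
    """Step 3: personalised propaganda framing the proposition around
    the individual's inferred concern."""
    concern = TOP_CONCERN[profile_group(person)]
    return f"Worried about {concern}? {proposition} is the answer."

# Two individuals receive differently framed appeals for the same proposition.
msg_retiree = nudge_message({"age": 70}, "Proposition X")
msg_young = nudge_message({"age": 30}, "Proposition X")
```

The point of the sketch is that the proposition never changes, only its framing; each recipient sees an appeal tailored to what the profiling predicts will move them.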

Big nudging poses risks to civil society because people may wrongly believe that they are acting of their own free will when they are in fact being nudged, undermining the values of that society. It is widely speculated that the results of the Brexit referendum and the US presidential election in 2016 might have been strongly affected by big nudging.

Both filter bubbles and big nudging involve the (sometimes irresponsible) use of personal data. They sit ill with our usual expectations of accountability and transparency in data processing. In mainland China, Art. 18 of the E-Commerce Law provides that when an e-commerce operator recommends goods or services to consumers based on their preferences or habits, it must also provide them with non-personalised options. Further, pursuant to Art. 23 of the Data Security Management Measures (consultation draft) published by the Cyberspace Administration of China in May 2019, network operators must label personalised recommendations of news and commercial advertisements, and allow individuals to opt out of receiving personalised information. These regulatory measures on the mainland are vivid examples of how the negative effects of filter bubbles and big nudging may be reduced.

All in all, with the increasing use of data for profiling or behavioural tracking, filter bubbles and big nudging in the data-driven economy, the conventional wisdom of personal data privacy, manifested through data minimisation, use limitation, purpose specification, transparency, accountability and so on, still holds water. As technology becomes ever more pervasive in all aspects of our lives, we have to be even more vigilant about our personal data.


Barrister, Privacy Commissioner for Personal Data, Hong Kong