How do we deal with algorithms in the attention economy?


Date: August 26 (Thu.) 2021, 14:00-16:00



  • Chen-Chao Tao, Chair, Taiwan Academy of Information Society
  • Cheng-Te Li, Associate Professor, Institute of Data Science, National Cheng Kung University
  • Shao-Man Lee, Visiting Assistant Professor, Miin Wu School of Computing, National Cheng Kung University
  • Ming-Syuan Ho, Project Manager, Research Center for Information Technology Innovation, Academia Sinica

Session details

Famous for coining the term ‘net neutrality’, Tim Wu, an author, lawyer, activist, and professor at Columbia Law School, published his book ‘The Attention Merchants: The Epic Scramble to Get Inside Our Heads’ in 2016. In it, Wu delineated how tech companies became the newest attention merchants, cultivating and harvesting our attention with the help of algorithms and customized advertisements. By this account, users constantly face a trade-off between their attention and free internet services. Some argue that this newest form of the attention economy, in which tech companies and social media platforms capture our attention with personalized content and sell it to other businesses, is what drives today’s divided society and growing hate crimes. How do we address this issue? American technology ethicist Tristan Harris advocates that the economic principle of maximizing net profit should evolve along with the impact technology has on humans and our environment. In other words, the tech giants should rethink what it means to ‘grow the business.’ Should Harris’s idea be realized? Or would it slow technological innovation?


Economics is the study of how scarce resources are allocated, whether housing, food, or money. But in an era of endless information at our fingertips, what is scarce? Unlike those three resources, which can be empirically quantified and measured, our intangible yet extremely valuable attention is the limiting factor: we are in the age of the attention economy[1].

The term “attention economy” was coined by psychologist, economist, and Nobel laureate Herbert A. Simon. According to Simon, attention is the “bottleneck of human thought,” limiting both what we can perceive in stimulating environments and what we can do. He also noted that “a wealth of information creates a poverty of attention.” In 1997, theoretical physicist Michael Goldhaber warned that the international economy was shifting from a material-based to an attention-based economy, pointing to the many online services offered for free.

The session’s moderator, Chen-Chao Tao, opened by echoing Goldhaber’s sentiment. Online services such as search engines, social media platforms, and instant messaging apps have drastically changed the way people receive information. Tao pointed out that tech companies are no longer mere information and communication technology service providers. Some of the tech giants, particularly Google and Facebook, have effectively been playing the role of media while consistently denying it.

Tao then introduced the panelists. The first was Cheng-Te Li, Associate Professor at the Institute of Data Science, National Cheng Kung University. From a data scientist’s point of view, Li shared his perspective on how we can deal with algorithms in the age of the attention economy.

Using YouTube as an example, Li illustrated how fully our daily lives are immersed in the attention economy. The recommended videos on our YouTube homepage are calculated from our viewing history on YouTube and our browsing history on other websites. The same applies to Facebook: most of the content in our personal feeds, though ostensibly from those we follow, is recommended to us by Facebook’s algorithm. We are surrounded by recommendations, and there is hardly any way to escape them.
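The kind of recommendation Li describes can be sketched in miniature. The following is a purely illustrative content-based recommender, not YouTube’s or Facebook’s actual system; the catalog, tags, and video IDs are all hypothetical. It builds a profile from a user’s watch history and ranks unseen videos by cosine similarity to that profile:

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity between two sparse tag-count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(history, catalog, top_n=2):
    """Rank unseen videos by similarity to a profile built from watch history."""
    profile = Counter()
    for video_id in history:
        profile.update(catalog[video_id])  # accumulate tag counts
    scores = {vid: cosine(profile, Counter(tags))
              for vid, tags in catalog.items() if vid not in history}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical catalog: video id -> descriptive tags.
catalog = {
    "v1": ["cooking", "asian"],
    "v2": ["cooking", "baking"],
    "v3": ["gaming", "retro"],
    "v4": ["cooking", "asian", "street-food"],
}
print(recommend(["v1", "v2"], catalog))  # the cooking video ranks first
```

Real systems add collaborative filtering, deep models, and engagement signals on top, but the core loop is the same: history in, ranked attention targets out.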

The American Psychological Association defines attention as “a state in which cognitive resources are focused on certain aspects of the environment rather than on others.” Although attention is theoretically unquantifiable, many derive its value from how much time we focus on a particular thing. We face attention’s scarcity every day: while “paying attention” to one thing, we ignore others.

Academia has long conducted research on attention, Li explained; economists, sociologists, and psychologists are all interested in how our attention works and how it can be, and has been, manipulated. One of the most prominent challenges researchers once faced in studying attention was collecting data. Data on what people choose to pay attention to and how, i.e. user preferences, was mainly collected through hardcopy surveys. The problem with surveys is that the data is heavily influenced by self-reporting bias. Another downside is that it is difficult to collect data on different topics with a single survey.

The paradigm shift in attention research came when the Internet transformed our daily lives. Tech companies can now easily collect abundant user data by following users’ digital footprints across the Internet and analyzing them with machine learning. On the one hand, they use the analysis themselves to profile and categorize users. On the other, they make money by selling advertising products built on that analysis to businesses that want to better target their potential customers.
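A minimal sketch of the profiling step might look like this. Everything here is hypothetical, including the user IDs, the page categories, and the idea of reducing a footprint to a single dominant interest (real profiling is far richer); the point is only to show how a clickstream becomes an audience segment an advertiser can buy:

```python
from collections import Counter

def profile_users(clickstream):
    """Label each user with the category that dominates their digital footprint."""
    return {user: Counter(pages).most_common(1)[0][0]
            for user, pages in clickstream.items()}

# Hypothetical footprints collected across many sites.
clickstream = {
    "u1": ["sports", "sports", "news"],
    "u2": ["fashion", "beauty", "fashion"],
    "u3": ["news", "politics", "news"],
}
segments = profile_users(clickstream)
# An advertiser could then target, say, every user in the "sports" segment.
print(segments)  # {'u1': 'sports', 'u2': 'fashion', 'u3': 'news'}
```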

Li identified four types of business models, depending on how companies use user data. The first is e-commerce, where companies collect data not only on purchases but also on clicks. The second is content-sharing platforms such as YouTube, Pinterest, and Instagram, which recommend channels and people to follow based on users’ subscriptions and searches. The third is social media platforms, whose algorithms curate our feeds, implicitly recommending content from the people we follow. Finally, online forums such as IMDb and TripAdvisor also rely heavily on user data.

Research has shown that microtargeting is extremely lucrative. It is no wonder companies keep collecting data despite growing awareness of personal-data protection and privacy. How can users counter this situation?

Li offered three suggestions. First, users have to understand that their personal data is extremely valuable. Personal data, including non-private and non-sensitive data, has become important personal property. Users should be more aware of this and know the rights they have over their data.

Second, we need to learn about, and keep reminding ourselves of, our own cognitive biases and the unwanted consequences of algorithms, whether it is the echo chamber, where we only hear the opinions we want to hear, or the bandwagon effect, where the crowd easily hops on and then abandons yet another ‘trend.’ The best defense is to get information from multiple and varied channels; fact-checking and better media literacy are also critical in guarding ourselves against the excesses of the attention economy.

Last but not least, Li noted that attention does not have to lead to action. Companies and merchants may easily catch our attention by showing us things we are interested in, but that does not mean we have to act on every trigger. Li suggested that users do their own research, compare, and evaluate before purchasing. Thinking hard about whether you really need a particular product is also a good way to prevent the impulse purchases that targeted ads provoke.

In the end, Li reiterated that algorithms are a double-edged sword. In the attention economy, we are essentially trading our privacy for convenience. But as long as users stay conscious of the downsides of the algorithms behind most online services and learn to filter out pseudo-recommendations, they can still fully enjoy the benefits of the Internet: connecting with people and accessing information from faraway places that would otherwise be unreachable.

The second panelist, Shao-Man Lee, also teaches at NCKU. Coming from a legal background, Lee shed light on the attention economy from a very different angle than the previous panelist. Lee’s argument rested on two premises: technology is never neutral, and any discussion of individual behavior must take into account the broader historical, social, and economic context.

One of the most important points Lee raised was that we should no longer ignore the harm the attention economy inflicts on democracy and free speech.

When Michael Goldhaber warned that the economy was changing from a material-based to an attention-based one, he also rejected the label ‘information economy’ in favor of ‘attention economy.’ After all, attention, not information, is the scarce resource.

Swiss economist Josef Falkinger argues that the scarcity of attention is a function of an information-rich economy; in other words, attentional scarcity results from too much information competing for our limited attention. Another unwanted consequence of attentional scarcity is ‘reverse censorship,’ a technique of speech control in which a regime distorts or drowns out disfavored speech through the creation and dissemination of fake news, the payment of fake commentators, and the deployment of propaganda bots.

We also have to learn to distinguish between consumer sovereignty and political sovereignty. Lee argued that consumer sovereignty is a myth in the attention economy: consumers think they are the ones who ‘like’ and ‘decide’ to buy things, while a considerable part of their preferences and purchases is actually shaped by the recommendations they get from the web.

Worse, in the attention economy our political choices are increasingly influenced by this ostensible consumer sovereignty. Politicians have begun to imitate what merchants do to attract consumer attention: they collect user data to understand preferences and tailor their talking points and campaigns to the target audience. A healthy political environment is one where all citizens make informed choices with their votes and reach compromise through lively, open public discussion. That ideal erodes as political sovereignty is jeopardized by this distorted consumer sovereignty.

Some solutions have been proposed and implemented. Twitter, for example, stopped accepting political ads in 2019. Google’s move was not as decisive, but the company did stop letting advertisers target election ads using data such as public voter records and general political affiliations. Some measures were taken only around elections: Facebook suspended all ads concerning social issues, elections, or politics during the American presidential election, and Google barred all election ads during Taiwan’s presidential election, from 15 Nov. 2019 to 17 Jan. 2020.

Although it is encouraging to see tech companies finally taking action, Lee contended that there is still a void when it comes to both an ethical code of conduct and regulation. She was not in favor of letting the tech companies self-regulate, arguing that regulatory development is also critical to achieving a more open, democratic, and accountable public space.

The last panelist was Ming-Syuan Ho, who recently joined the Research Center for Information Technology Innovation at Academia Sinica after six years at the Taiwan Association for Human Rights (TAHR).

At TAHR, Ho managed Taiwan’s Internet Transparency Report project, which surveyed and tracked whether and how the government censors content hosted by Internet service providers (ISPs), as well as the government’s requests for personal information from these providers.

To avoid repeating what the previous speakers had already discussed thoroughly, Ho focused his presentation on how to make algorithms more transparent.

The Santa Clara Principles are the result of a collective effort by NGOs and digital rights groups to increase the transparency and accountability of content moderation. They propose three principles:

  1. Numbers: Companies should publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines.
  2. Notice: Companies should provide notice to each user whose content is taken down or account is suspended about the reason for the removal or suspension.
  3. Appeal: Companies should provide a meaningful opportunity for timely appeal of any content removal or account suspension.
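To make the three principles concrete, here is a toy sketch of a moderation log that records removals, issues notices, accepts appeals, and publishes aggregate numbers. This is purely illustrative; the class names, user IDs, and content IDs are invented and correspond to no real platform’s API:

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class Removal:
    user: str
    content_id: str
    reason: str
    appealed: bool = False

@dataclass
class TransparencyLog:
    removals: list = field(default_factory=list)

    def remove(self, user, content_id, reason):
        """Notice: record the action and tell the user what happened and why."""
        self.removals.append(Removal(user, content_id, reason))
        return f"To {user}: your content {content_id} was removed ({reason})."

    def appeal(self, content_id):
        """Appeal: any removal can be contested."""
        for action in self.removals:
            if action.content_id == content_id:
                action.appealed = True

    def numbers(self):
        """Numbers: aggregate removal counts per reason for a public report."""
        return dict(Counter(a.reason for a in self.removals))

log = TransparencyLog()
log.remove("alice", "post-42", "spam")
log.remove("bob", "post-43", "hate speech")
log.appeal("post-43")
print(log.numbers())  # {'spam': 1, 'hate speech': 1}
```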

Ho believed these principles would help build more meaningful transparency around algorithms. He quoted Fung, Graham, and Weil to illustrate the vision of meaningful transparency: ‘for transparency to be meaningful, it has to be targeted—not just increasing information, but communicating in a way that can be used to help hold decision-makers to account’.

He also touched on data transparency for research, referencing the news that Facebook threatened to sue the advocacy group AlgorithmWatch unless it stopped using tracking tools on Instagram to monitor politicians’ activities. Matthias Spielkamp, AlgorithmWatch’s executive director, had no choice but to take down the tracking tool, even though the data collected was purely for research purposes.

It is indeed ironic that Facebook, so rapacious in collecting user data on its own platforms, suddenly started to care about ‘protecting user data.’ Due to time constraints, Ho could not dive deeper into the legitimacy and accountability of data collection for research purposes. He did, however, make a final point: the discussion of algorithms and how we use data deserves far more attention from Taiwanese society.

[1] Ally Mintzer, “Paying Attention: The Attention Economy,” Berkeley Economic Review.
